44. Other Activation Functions
Other activation functions in Keras
Changing activation functions is easy. So far we've been using sigmoid (or softmax when there is more than one class, which is the case when we one-hot encode the output), which is added after a layer as follows:
model.add(Activation('sigmoid'))
or
model.add(Activation('softmax'))
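For instance, a small fully connected network built this way might look like the following sketch (the layer sizes and input dimension here are placeholder values, not taken from the lines above):
from keras.models import Sequential
from keras.layers import Dense, Activation

# Minimal sketch: sigmoid activations throughout, ending in a single
# sigmoid output for a binary label (sizes below are placeholders)
model = Sequential()
model.add(Dense(32, input_dim=10))
model.add(Activation('sigmoid'))
model.add(Dense(1))
model.add(Activation('sigmoid'))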
If we want to use relu or tanh, we just specify the name of the activation function as relu or tanh:
model.add(Activation('relu'))
model.add(Activation('tanh'))
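Putting it together, a hidden layer with relu followed by a softmax output for one-hot encoded labels might look like this sketch (again, the layer sizes, input dimension, and number of classes are just placeholders):
from keras.models import Sequential
from keras.layers import Dense, Activation

# Minimal sketch: relu in the hidden layer, softmax on the output
# (layer sizes and the 3-class output are placeholder values)
model = Sequential()
model.add(Dense(32, input_dim=10))
model.add(Activation('relu'))
model.add(Dense(3))
model.add(Activation('softmax'))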